    Parameterizing by the Number of Numbers

    The usefulness of parameterized algorithmics has often depended on what Niedermeier has called "the art of problem parameterization". In this paper we introduce and explore a novel but general form of parameterization: the number of numbers. Several classic numerical problems, such as Subset Sum, Partition, 3-Partition, Numerical 3-Dimensional Matching, and Numerical Matching with Target Sums, have multisets of integers as input. We initiate the study of parameterizing these problems by the number of distinct integers in the input. We rely on an FPT result for Integer Linear Programming Feasibility (ILPF) to show that all the above-mentioned problems are fixed-parameter tractable when parameterized in this way. In various applied settings, problem inputs often consist in part of multisets of integers or multisets of weighted objects (such as edges in a graph, or jobs to be scheduled). Such number-of-numbers parameterized problems often reduce to subproblems about transition systems of various kinds, parameterized by the size of the system description. We consider several core problems of this kind relevant to number-of-numbers parameterization. Our main hardness result considers the following problem: given a non-deterministic Mealy machine M (a finite state automaton outputting a letter on each transition), an input word x, and a census requirement c for the output word specifying how many times each letter of the output alphabet should be written, decide whether there exists a computation of M reading x that outputs a word y meeting the requirement c. We show that this problem is hard for W[1]. If the question is instead whether there exists an input word x such that a computation of M on x outputs a word that meets c, the problem becomes fixed-parameter tractable.
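
    A minimal sketch of the "number of numbers" view for Subset Sum, assuming Python and illustrative names not taken from the paper: with d distinct input values, one only needs to decide how many copies of each value to take, which is one natural d-variable integer program behind the kind of ILPF-based argument the abstract mentions. The sketch below simply brute-forces these multiplicities on small instances rather than invoking Lenstra-type ILP feasibility.

    from collections import Counter
    from itertools import product

    def subset_sum_by_distinct_values(multiset, target):
        """Decide Subset Sum by searching over multiplicities of the d distinct
        input values (the 'number of numbers' view). Brute force only; an FPT
        algorithm would instead solve the same d-variable integer program via
        ILP feasibility."""
        counts = Counter(multiset)          # value -> number of available copies
        values = list(counts)
        # One variable x_v per distinct value v, with 0 <= x_v <= counts[v];
        # ask whether sum(x_v * v) == target for some choice of the x_v.
        for choice in product(*(range(counts[v] + 1) for v in values)):
            if sum(x * v for x, v in zip(choice, values)) == target:
                return True
        return False

    # 8 input numbers but only 3 distinct values, so only 4 * 4 * 3 candidate choices.
    print(subset_sum_by_distinct_values([2, 2, 2, 5, 5, 5, 9, 9], 21))   # True: 2 + 5 + 5 + 9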

    Applying a Cut-Based Data Reduction Rule for Weighted Cluster Editing in Polynomial Time

    Given an undirected graph, the task in Cluster Editing is to insert and delete a minimum number of edges to obtain a cluster graph, that is, a disjoint union of cliques. In the weighted variant each vertex pair comes with a weight and the edge modifications have to be of minimum overall weight. In this work, we provide the first polynomial-time algorithm to apply the following data reduction rule of Böcker et al. [Algorithmica, 2011] for Weighted Cluster Editing: For a graph G = (V, E), merge a vertex set S ⊆ V into a single vertex if the minimum cut of G[S] is at least the combined cost of inserting all missing edges within G[S] plus the cost of cutting all edges from S to the rest of the graph. Complementing our theoretical findings, we experimentally demonstrate the effectiveness of the data reduction rule, shrinking real-world test instances from the PACE Challenge 2021 by around 24% while previous heuristic implementations of the data reduction rule only achieve 8%.
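
    A small sketch, under assumptions not made by the paper (unweighted graph, all insertion and deletion costs equal to 1, and a candidate set S given rather than found), of checking the quoted merge condition with a standard minimum-cut routine (Stoer-Wagner in networkx); function names are illustrative.

    import networkx as nx

    def merge_condition_holds(G, S):
        """Check the cut-based merge rule for a candidate set S in an unweighted
        graph G (unit costs): merge S if mincut(G[S]) is at least the number of
        missing edges inside S plus the number of edges leaving S."""
        H = G.subgraph(S)
        if len(S) < 2 or not nx.is_connected(H):
            return False
        mincut_value, _ = nx.stoer_wagner(H)              # minimum cut of G[S]
        missing_inside = len(S) * (len(S) - 1) // 2 - H.number_of_edges()
        outgoing = sum(1 for u in S for v in G[u] if v not in S)
        return mincut_value >= missing_inside + outgoing

    # A triangle with one pendant edge: merging the triangle is safe here.
    G = nx.Graph([(1, 2), (2, 3), (1, 3), (3, 4)])
    print(merge_condition_holds(G, {1, 2, 3}))            # True: mincut 2 >= 0 + 1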

    Surface roughness during depositional growth and sublimation of ice crystals

    Full version of an earlier discussion paper (Chou et al. 2018). Ice surface properties can modify the scattering properties of atmospheric ice crystals and therefore affect the radiative properties of mixed-phase and cirrus clouds. The Ice Roughness Investigation System (IRIS) is a new laboratory setup designed to investigate the conditions under which roughness develops on single ice crystals, based on their size, morphology and growth conditions (relative humidity and temperature). Ice roughness is quantified through the analysis of speckle in 2-D light-scattering patterns. Characterization of the setup shows that a supersaturation of 20 % with respect to ice and a temperature at the sample position as low as -40 °C could be achieved within IRIS. Investigations of the influence of humidity show that higher supersaturations with respect to ice lead to enhanced roughness and irregularities of ice crystal surfaces. Moreover, relative humidity oscillations lead to a gradual ratcheting-up of roughness and irregularities as the crystals undergo repeated growth-sublimation cycles. This memory effect also appears to result in reduced growth rates in later cycles. Thus, growth history, as well as supersaturation and temperature, influences ice crystal growth and properties; future atmospheric models may benefit from including this effect in the cloud evolution process, allowing more accurate representation not only of roughness but also of crystal size, and possibly of electrification properties.

    A New Lower Bound on the Maximum Number of Satisfied Clauses in Max-SAT and its Algorithmic Applications

    A pair of unit clauses is called conflicting if it is of the form (x), (x̄). A CNF formula is unit-conflict free (UCF) if it contains no pair of conflicting unit clauses. Lieberherr and Specker (J. ACM 28, 1981) showed that for each UCF CNF formula with m clauses we can simultaneously satisfy at least φm clauses, where φ = (√5 − 1)/2. We improve the Lieberherr-Specker bound by showing that for each UCF CNF formula F with m clauses we can find, in polynomial time, a subformula F′ with m′ clauses such that we can simultaneously satisfy at least φm + (1 − φ)m′ + (2 − 3φ)n″/2 clauses (in F), where n″ is the number of variables in F which are not in F′. We consider two parameterized versions of MAX-SAT, where the parameter is the number of satisfied clauses above the bounds m/2 and m(√5 − 1)/2. The former bound is tight for general formulas, and the latter is tight for UCF formulas. Mahajan and Raman (J. Algorithms 31, 1999) showed that every instance of the first parameterized problem can be transformed, in polynomial time, into an equivalent one with at most 6k + 3 variables and 10k clauses. We improve this to 4k variables and (2√5 + 4)k clauses. Mahajan and Raman conjectured that the second parameterized problem is fixed-parameter tractable (FPT). We show that the problem is indeed FPT by describing a polynomial-time algorithm that transforms any problem instance into an equivalent one with at most (7 + 3√5)k variables. Our results are obtained using our improvement of the Lieberherr-Specker bound above.
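
    A quick numeric sketch of the bounds quoted above, assuming Python and names of my own choosing; it only evaluates the guarantees for given m, m′ and n″ and does not construct the subformula F′ or a satisfying assignment.

    from math import sqrt

    PHI = (sqrt(5) - 1) / 2          # Lieberherr-Specker constant, about 0.618

    def lieberherr_specker_bound(m):
        """Clauses guaranteed satisfiable in a UCF formula with m clauses."""
        return PHI * m

    def improved_bound(m, m_prime, n_doubleprime):
        """Improved guarantee from the abstract:
        phi*m + (1 - phi)*m' + (2 - 3*phi)*n''/2."""
        return PHI * m + (1 - PHI) * m_prime + (2 - 3 * PHI) * n_doubleprime / 2

    print(lieberherr_specker_bound(100))     # ~61.8
    print(improved_bound(100, 20, 10))       # larger, since both extra coefficients are positive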

    The simplicity project: easing the burden of using complex and heterogeneous ICT devices and services

    As of today, to exploit the variety of different "services", users need to configure each of their devices by using different procedures and need to explicitly select among heterogeneous access technologies and protocols. In addition, users are authenticated and charged by different means. The lack of implicit human-computer interaction, context-awareness and standardisation places an enormous burden of complexity on the shoulders of the final users. The IST-Simplicity project aims at alleviating these problems by: i) automatically creating and customizing a user communication space; ii) adapting services to user terminal characteristics and to user preferences; iii) orchestrating network capabilities. The aim of this paper is to present the technical framework of the IST-Simplicity project. The paper provides a thorough analysis and qualitative evaluation of the technologies, standards and works presented in the literature that relate to the Simplicity system to be developed.

    Cell wall characteristics during sexual reproduction of Mougeotia sp. (Zygnematophyceae) revealed by electron microscopy, glycan microarrays and RAMAN spectroscopy

    Mougeotia spp. collected from field samples were investigated for their conjugation morphology by light, fluorescence, scanning electron and transmission electron microscopy. During scalariform conjugation, the extragametangial zygospores were initially surrounded by a thin cell wall that developed into a multi-layered zygospore wall. Maturing zygospores turned dark brown and were filled with storage compounds such as lipids and starch. While M. parvula had a smooth surface, M. disjuncta had a punctated surface structure and a prominent suture. The zygospore wall consisted of a polysaccharide-rich endospore, followed by a thin layer with a lipid-like appearance, a massive electron-dense mesospore and a very thin exospore composed of polysaccharides. Glycan microarray analysis of zygospores of different developmental stages revealed the occurrence of pectins and hemicelluloses, mostly composed of homogalacturonan (HG), xyloglucans, xylans, arabinogalactan proteins and extensins. In situ localization by the probe OG7-13AF 488 labelled HG in young zygospore walls, vegetative filaments and, most prominently, in conjugation tubes and cross walls. Raman imaging showed the distribution of proteins, lipids, carbohydrates and aromatic components of the mature zygospore with a spatial resolution of ~250 nm. The carbohydrate nature of the endo- and exospore was confirmed, with an enrichment of lipids and aromatic components in between, probably algaenan or a sporopollenin-like material. Taken together, these results indicate that during zygospore formation, reorganizations of the cell walls occurred, leading to a resistant and protective structure.

    Augmenting graphs to minimize the diameter

    We study the problem of augmenting a weighted graph by inserting edges of bounded total cost while minimizing the diameter of the augmented graph. Our main result is an FPT 4-approximation algorithm for the problem.
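
    The FPT 4-approximation itself is not reproduced here; the brute-force sketch below (Python, illustrative names, candidate edges given as (u, v, weight, cost) tuples) is only meant to make the problem statement concrete on small instances: among all insertable edge sets of total cost within the budget, pick one minimizing the weighted diameter.

    from itertools import combinations
    import networkx as nx

    def weighted_diameter(G):
        """Largest shortest-path distance between any pair of vertices."""
        return max(d
                   for _, dists in nx.all_pairs_dijkstra_path_length(G, weight="weight")
                   for d in dists.values())

    def best_augmentation(G, candidates, budget):
        """Try every subset of candidate (u, v, weight, cost) edges with total
        cost <= budget and return the smallest achievable diameter. Exponential
        in len(candidates); a baseline, not the paper's algorithm."""
        best = weighted_diameter(G)
        for r in range(1, len(candidates) + 1):
            for subset in combinations(candidates, r):
                if sum(c for *_, c in subset) > budget:
                    continue
                H = G.copy()
                H.add_weighted_edges_from((u, v, w) for u, v, w, _ in subset)
                best = min(best, weighted_diameter(H))
        return best

    # A unit-weight path 1-2-3-4; one candidate chord of weight 1 and cost 1.
    G = nx.Graph()
    G.add_weighted_edges_from([(1, 2, 1), (2, 3, 1), (3, 4, 1)])
    print(best_augmentation(G, [(1, 4, 1, 1)], budget=1))   # diameter drops from 3 to 2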

    Polynomial fixed-parameter algorithms: a case study for longest path on interval graphs.

    We study the design of fixed-parameter algorithms for problems already known to be solvable in polynomial time. The main motivation is to get more efficient algorithms for problems with unattractive polynomial running times. Here, we focus on a fundamental graph problem: Longest Path, that is, given an undirected graph G, find a maximum-length path in G. Longest Path is NP-hard in general but known to be solvable in O(n^4) time on n-vertex interval graphs. We show how to solve Longest Path on interval graphs, parameterized by the vertex deletion number k to proper interval graphs, in O(k^9 · n) time. Notably, Longest Path is trivially solvable in linear time on proper interval graphs, and the parameter value k can be approximated up to a factor of 4 in linear time. From a more general perspective, we believe that parameterized complexity analysis may enable a refined understanding of efficiency aspects for polynomial-time solvable problems, similarly to what classical parameterized complexity analysis does for NP-hard problems.
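
    One ingredient mentioned above is that Longest Path is easy on proper interval graphs: in a connected proper interval graph, sorting the vertices by their interval endpoints yields a Hamiltonian path, so a longest path falls out of the ordering. The sketch below (Python, illustrative names, interval representation assumed to be given) demonstrates only that classical observation; it is not the paper's O(k^9 · n) algorithm.

    def longest_path_proper_interval(intervals):
        """Given {vertex: (left, right)} for a connected proper interval graph,
        return the vertex order by interval endpoints, which is a Hamiltonian
        path and hence a longest path."""
        order = sorted(intervals, key=lambda v: intervals[v])
        # Sanity check: consecutive vertices must overlap, i.e. be adjacent.
        for u, v in zip(order, order[1:]):
            (lu, ru), (lv, rv) = intervals[u], intervals[v]
            assert max(lu, lv) <= min(ru, rv), "not a connected proper interval graph"
        return order

    # Three overlapping intervals in a row: the path visits them left to right.
    print(longest_path_proper_interval({"a": (0, 2), "b": (1, 3), "c": (2, 4)}))   # ['a', 'b', 'c']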